needle tip
A Deep Learning-Driven Autonomous System for Retinal Vein Cannulation: Validation Using a Chicken Embryo Model
Wang, Yi, Zhang, Peiyao, Esfandiari, Mojtaba, Gehlbach, Peter, Iordachita, Iulian I.
Retinal vein cannulation (RVC) is a minimally invasive microsurgical procedure for treating retinal vein occlusion (RVO), a leading cause of vision impairment. However, the small size and fragility of retinal veins, coupled with the need for high-precision, tremor-free needle manipulation, create significant technical challenges. These limitations highlight the need for robotic assistance to improve accuracy and stability. This study presents an automated robotic system with a top-down microscope and B-scan optical coherence tomography (OCT) imaging for precise depth sensing. Deep learning-based models enable real-time needle navigation, contact detection, and vein puncture recognition, using a chicken embryo model as a surrogate for human retinal veins. The experiments demonstrate notable reductions in navigation and puncture times compared to manual methods. Our results demonstrate the potential of integrating advanced imaging and deep learning to automate microsurgical tasks, providing a pathway for safer and more reliable RVC procedures with enhanced precision and reproducibility.
I. INTRODUCTION
Retinal vein occlusion (RVO) occurs due to the blockage of a retinal vein by a thrombus, leading to transient or permanent vision loss [1]. Current treatments focus on managing complications, but no standardized surgical approach exists for thrombus removal. A 2015 meta-analysis identified RVO as the second most prevalent retinal vascular disease globally, affecting 28.06 million people aged 30-89, including 23.38 million branch RVO (BRVO) and 4.67 million central RVO (CRVO) cases [2]. Retinal vein cannulation (RVC) involves inserting a micro-needle into the occluded retinal vein, followed by injecting a thrombolytic agent to dissolve the clot [3].
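The contact- and puncture-recognition step described above can be sketched as a debounced detector over per-frame classifier scores. A minimal illustration, assuming a hypothetical classifier that emits one confidence score per OCT B-scan; the threshold and window size are illustrative, not values from the paper:

```python
from collections import deque

class EventDetector:
    """Debounced event detector for per-frame classifier scores.

    Illustrative only: `score` would come from a hypothetical
    contact/puncture classifier run on each OCT B-scan.
    """
    def __init__(self, threshold=0.8, window=5):
        self.threshold = threshold
        self.scores = deque(maxlen=window)

    def update(self, score):
        # Fire only when the moving average of recent frame scores
        # exceeds the threshold, suppressing single-frame noise.
        self.scores.append(score)
        avg = sum(self.scores) / len(self.scores)
        return avg >= self.threshold

det = EventDetector(threshold=0.8, window=3)
events = [det.update(s) for s in [0.1, 0.9, 0.95, 0.9, 0.2]]
```

The windowed average trades a few frames of latency for robustness against the single-frame false positives that raw thresholding would produce.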
A Feasible Workflow for Retinal Vein Cannulation in Ex Vivo Porcine Eyes with Robotic Assistance
Zhang, Peiyao, Gehlbach, Peter, Kobilarov, Marin, Iordachita, Iulian
A potential Retinal Vein Occlusion (RVO) treatment involves Retinal Vein Cannulation (RVC), which requires the surgeon to insert a microneedle into the affected retinal vein and administer a clot-dissolving drug. This procedure presents significant challenges due to human physiological limitations, such as hand tremors, prolonged tool-holding periods, and constraints in depth perception using a microscope. This study proposes a robot-assisted workflow for RVC to overcome these limitations. The test robot is operated through a keyboard. An intraoperative Optical Coherence Tomography (iOCT) system is used to verify successful venous puncture before infusion. The workflow is validated using 12 ex vivo porcine eyes. These early results demonstrate a success rate of 10 out of 12 cannulations (83.33%), affirming the feasibility of the proposed workflow.
- North America > United States > Maryland > Baltimore (0.05)
- Asia (0.05)
- Oceania > Australia (0.04)
- Europe (0.04)
- Workflow (1.00)
- Research Report > New Finding (0.67)
Towards Motion Compensation in Autonomous Robotic Subretinal Injections
Arikan, Demir, Zhang, Peiyao, Sommersperger, Michael, Dehghani, Shervin, Esfandiari, Mojtaba, Taylor, Russel H., Nasseri, M. Ali, Gehlbach, Peter, Navab, Nassir, Iordachita, Iulian
Exudative (wet) age-related macular degeneration (AMD) is a leading cause of vision loss in older adults, typically treated with intravitreal injections. Emerging therapies, such as subretinal injections of stem cells, gene therapy, small molecules or RPE cells, require precise delivery to avoid damaging delicate retinal structures. Autonomous robotic systems can potentially offer the necessary precision for these procedures. This paper presents a novel approach for motion compensation in robotic subretinal injections, utilizing real-time Optical Coherence Tomography (OCT). The proposed method leverages B5-scans, a rapid acquisition of small-volume OCT data, for dynamic tracking of retinal motion along the Z-axis, compensating for physiological movements such as breathing and heartbeat. Validation experiments on ex vivo porcine eyes revealed challenges in maintaining a consistent tool-to-retina distance, with deviations of up to 200 μm for 100 μm amplitude motions and over 80 μm for 25 μm amplitude motions over one minute. Subretinal injections faced additional difficulties, with horizontal shifts causing the needle to move off-target and inject into the vitreous. These results highlight the need for improved motion prediction and horizontal stability to enhance the accuracy and safety of robotic subretinal procedures.
- Europe > Germany > North Rhine-Westphalia > Upper Bavaria > Munich (0.05)
- Europe > Germany > Bavaria > Upper Bavaria > Munich (0.05)
- North America > United States > Maryland > Baltimore (0.04)
- (2 more...)
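The Z-axis tracking idea above can be sketched as a proportional controller that servoes the tool to hold a fixed tool-to-retina distance against a breathing-like motion. This is a minimal illustration, not the paper's method; the gain, step limit, target gap, and sinusoidal motion model are all assumptions:

```python
import math

def compensate_z(z_retina, z_tool, target_gap, gain=0.5, max_step=0.02):
    """One axial control step: move the tool along Z toward the desired
    tool-to-retina gap (mm). Gain and step limit are illustrative values,
    not parameters reported in the paper."""
    error = (z_retina - z_tool) - target_gap
    step = max(-max_step, min(max_step, gain * error))
    return z_tool + step

# Track a simulated breathing motion of the retina (sinusoid, mm).
z_tool = 0.0
gaps = []
for k in range(200):
    z_retina = 0.5 + 0.1 * math.sin(2 * math.pi * k / 50)
    z_tool = compensate_z(z_retina, z_tool, target_gap=0.35)
    gaps.append(z_retina - z_tool)
```

Even this idealized loop shows a residual tracking lag on sinusoidal motion, which is consistent with the paper's observation that purely reactive compensation leaves deviations and that motion prediction is needed.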
MambaXCTrack: Mamba-based Tracker with SSM Cross-correlation and Motion Prompt for Ultrasound Needle Tracking
Zhang, Yuelin, Ding, Qingpeng, Lei, Long, Shan, Jiwei, Xie, Wenxuan, Zhang, Tianyi, Yan, Wanquan, Tang, Raymond Shing-Yan, Cheng, Shing Shin
Ultrasound (US)-guided needle insertion is widely employed in percutaneous interventions. However, providing feedback on the needle tip position via US image presents challenges due to noise, artifacts, and the thin imaging plane of US, which degrades needle features and leads to intermittent tip visibility. In this paper, a Mamba-based US needle tracker MambaXCTrack utilizing structured state space models cross-correlation (SSMX-Corr) and implicit motion prompt is proposed, which is the first application of Mamba in US needle tracking. The SSMX-Corr enhances cross-correlation by long-range modeling and global searching of distant semantic features between template and search maps, benefiting the tracking under noise and artifacts by implicitly learning potential distant semantic cues. By combining with cross-map interleaved scan (CIS), local pixel-wise interaction with positional inductive bias can also be introduced to SSMX-Corr. The implicit low-level motion descriptor is proposed as a non-visual prompt to enhance tracking robustness, addressing the intermittent tip visibility problem. Extensive experiments on a dataset with motorized needle insertion in both phantom and tissue samples demonstrate that the proposed tracker outperforms other state-of-the-art trackers while ablation studies further highlight the effectiveness of each proposed tracking module.
- Asia > China > Hong Kong (0.05)
- Europe > Netherlands > North Holland > Amsterdam (0.04)
- Asia > China > Jiangsu Province (0.04)
- (2 more...)
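For contrast with the learned SSMX-Corr module, the underlying matching step can be illustrated with classical normalized cross-correlation between a template and a search map. A brute-force numpy sketch, for clarity only; the paper replaces this with a learned, long-range module precisely because plain NCC degrades under US noise and artifacts:

```python
import numpy as np

def ncc_match(search, template):
    """Classical normalized cross-correlation template matching.
    Returns the top-left position of the best match and its score."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t) + 1e-8
    best, best_pos = -2.0, (0, 0)
    for i in range(search.shape[0] - th + 1):
        for j in range(search.shape[1] - tw + 1):
            patch = search[i:i + th, j:j + tw]
            p = patch - patch.mean()
            score = float(p.ravel() @ t.ravel()) / ((np.linalg.norm(p) + 1e-8) * tn)
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos, best

rng = np.random.default_rng(0)
search = rng.normal(size=(32, 32))
template = search[10:18, 14:22].copy()  # known ground-truth location
pos, score = ncc_match(search, template)
```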
Real-time Deformation-aware Control for Autonomous Robotic Subretinal Injection under iOCT Guidance
Arikan, Demir, Zhang, Peiyao, Sommersperger, Michael, Dehghani, Shervin, Esfandiari, Mojtaba, Taylor, Russel H., Nasseri, M. Ali, Gehlbach, Peter, Navab, Nassir, Iordachita, Iulian
Robotic platforms provide repeatable and precise tool positioning that significantly enhances retinal microsurgery. Integrating such systems with intraoperative optical coherence tomography (iOCT) enables image-guided robotic interventions, allowing advanced treatments, such as injecting therapeutic agents into the subretinal space, to be performed autonomously. Yet, tissue deformations due to tool-tissue interactions are a major challenge in autonomous iOCT-guided robotic subretinal injection, impacting correct needle positioning and, thus, the outcome of the procedure. This paper presents a novel method for autonomous subretinal injection under iOCT guidance that accounts for tissue deformations during the insertion procedure. This is achieved through real-time segmentation and 3D reconstruction of the surgical scene from densely sampled iOCT B-scans, which we refer to as B5-scans, to monitor the position of the instrument relative to a virtual target layer defined at a relative position between the ILM and RPE. Our experiments on ex-vivo porcine eyes demonstrate dynamic adjustment of the insertion depth and overall improved accuracy in needle positioning compared to previous autonomous insertion approaches. Whereas previous approaches achieved a 35% success rate in subretinal bleb generation, our proposed method reliably and robustly created subretinal blebs in all our experiments.
- Europe > Germany > Bavaria > Upper Bavaria > Munich (0.05)
- Europe > Germany > North Rhine-Westphalia > Upper Bavaria > Munich (0.04)
- North America > United States > Maryland > Baltimore (0.04)
- (2 more...)
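The virtual target layer defined at a relative position between the ILM and RPE reduces to an interpolation of the two segmented layer depths, recomputed per frame as the tissue deforms. A sketch, with the relative position `alpha` and the depth values chosen illustratively:

```python
def virtual_target_depth(z_ilm, z_rpe, alpha=0.5):
    """Depth of a virtual target layer at relative position `alpha`
    between the ILM (alpha=0) and RPE (alpha=1), in the same units as
    the B-scan axial axis. `alpha` here is an illustrative choice."""
    return z_ilm + alpha * (z_rpe - z_ilm)

# As tissue deforms, the layer boundaries move, so the target depth is
# recomputed each frame from the newly segmented B5-scan.
z_target = virtual_target_depth(z_ilm=120.0, z_rpe=220.0, alpha=0.6)
```

Defining the target relative to both boundaries, rather than as a fixed depth below the ILM, is what lets the insertion depth adjust when compression moves the layers during insertion.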
Shape Manipulation of Bevel-Tip Needles for Prostate Biopsy Procedures: A Comparison of Two Resolved-Rate Controllers
Wang, Yanzhou, Al-Zogbi, Lidia, Liu, Jiawei, Shepard, Lauren, Ghazi, Ahmed, Tokuda, Junichi, Leonard, Simon, Krieger, Axel, Iordachita, Iulian
Prostate cancer diagnosis continues to encounter challenges, often due to imprecise needle placement in standard biopsies. Several control strategies have been developed to compensate for needle tip prediction inaccuracies; however, none have been compared against each other, and it is unclear whether any of them can be safely and universally applied in clinical settings. This paper compares the performance of two resolved-rate controllers, derived from a mechanics-based and a data-driven approach, for bevel-tip needle control using needle shape manipulation through a template. For a simulated 12-core biopsy procedure under model parameter uncertainty, we demonstrate that the mechanics-based controller better reaches desired targets when only the final goal configuration is presented, and that providing a feasible needle path is crucial to ensuring safe surgical outcomes when either controller is used for needle shape manipulation.
- North America > United States > Maryland > Baltimore (0.05)
- North America > United States > Massachusetts > Suffolk County > Boston (0.04)
- Asia > China > Hong Kong (0.04)
- Health & Medicine > Diagnostic Medicine > Biopsy (0.83)
- Health & Medicine > Therapeutic Area > Oncology > Prostate Cancer (0.35)
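A resolved-rate controller of the kind compared here maps a task-space error to joint (or template manipulation) velocities through the pseudoinverse of a Jacobian. A generic sketch, assuming an arbitrary Jacobian and illustrative gain, damping, and velocity limits, not the paper's mechanics-based or data-driven models:

```python
import numpy as np

def resolved_rate_step(J, x_err, gain=1.0, qdot_max=0.1):
    """One resolved-rate iteration: joint velocities from the damped
    pseudoinverse of the Jacobian J (m x n), driving the task-space
    error x_err toward zero. Damping keeps the inverse well-conditioned
    near singularities; the saturation caps actuator speed."""
    lam = 1e-3
    JT = J.T
    qdot = JT @ np.linalg.solve(J @ JT + lam * np.eye(J.shape[0]), gain * x_err)
    n = np.linalg.norm(qdot)
    if n > qdot_max:
        qdot *= qdot_max / n  # preserve direction, limit magnitude
    return qdot

# Illustrative 2-DOF task, 3-DOF actuation (redundant).
J = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.1]])
x_err = np.array([0.01, -0.02])
qdot = resolved_rate_step(J, x_err)
x_rate = J @ qdot  # resulting task-space velocity
```

The two controllers in the paper differ in how J is obtained (mechanics model vs. learned), not in this outer update law.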
Bevel-Tip Needle Deflection Modeling, Simulation, and Validation in Multi-Layer Tissues
Wang, Yanzhou, Al-Zogbi, Lidia, Liu, Guanyun, Liu, Jiawei, Tokuda, Junichi, Krieger, Axel, Iordachita, Iulian
Percutaneous needle insertions are commonly performed for diagnostic and therapeutic purposes as an effective alternative to more invasive surgical procedures. However, the outcome of needle-based approaches relies heavily on the accuracy of needle placement, which remains a challenge even with robot assistance and medical imaging guidance due to needle deflection caused by contact with soft tissues. In this paper, we present a novel mechanics-based 2D bevel-tip needle model that can account for the effect of nonlinear strain-dependent behavior of biological soft tissues under compression. Real-time finite element simulation allows multiple control inputs along the length of the needle with full three-degree-of-freedom (DOF) planar needle motions. Cross-validation studies using custom-designed multi-layer tissue phantoms as well as heterogeneous chicken breast tissues result in less than 1 mm in-plane errors for insertions reaching depths of up to 61 mm, demonstrating the validity and generalizability of the proposed method.
- North America > United States > Massachusetts > Suffolk County > Boston (0.04)
- North America > United States > Maryland > Baltimore (0.04)
- Europe > Germany (0.04)
- Asia > China > Hong Kong (0.04)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (0.68)
- Health & Medicine > Therapeutic Area > Oncology (1.00)
- Health & Medicine > Health Care Technology (0.88)
- Health & Medicine > Diagnostic Medicine > Imaging (0.66)
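As a point of reference for the mechanics-based model above, the classical kinematic baseline treats a bevel-tip needle as following a constant-curvature arc during insertion. A 2D sketch of that baseline (not the paper's strain-dependent finite element model; the curvature value is illustrative):

```python
import math

def insert_bevel_needle(depth, kappa, step=0.5):
    """2D constant-curvature kinematic model of a bevel-tip needle:
    the asymmetric tip force makes the needle follow an arc of
    curvature `kappa` (1/mm) as it is inserted to `depth` (mm)."""
    x, z, theta = 0.0, 0.0, 0.0  # lateral deflection, depth, heading
    path = [(x, z)]
    for _ in range(int(depth / step)):
        theta += kappa * step        # heading rotates at rate kappa
        x += step * math.sin(theta)  # lateral deflection accumulates
        z += step * math.cos(theta)
        path.append((x, z))
    return path

path = insert_bevel_needle(depth=60.0, kappa=0.005)
x_tip, z_tip = path[-1]
```

The constant-curvature assumption breaks down across tissue layers with different stiffness, which is the regime the paper's multi-layer, strain-dependent model targets.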
EyeLS: Shadow-Guided Instrument Landing System for Intraocular Target Approaching in Robotic Eye Surgery
Yang, Junjie, Zhao, Zhihao, Shen, Siyuan, Zapp, Daniel, Maier, Mathias, Huang, Kai, Navab, Nassir, Nasseri, M. Ali
Robotic ophthalmic surgery is an emerging technology that facilitates high-precision interventions such as retina penetration in subretinal injection and removal of floating tissues in retinal detachment, depending on input imaging modalities such as microscopy and intraoperative OCT (iOCT). Although iOCT has been explored for locating the needle tip within its range-limited ROI, it remains difficult to coordinate the iOCT's motion with the needle, especially at the initial target-approaching stage. Meanwhile, due to 2D perspective projection and the resulting loss of depth information, current image-based methods cannot effectively estimate the needle tip's trajectory towards both retinal and floating targets. To address this limitation, we propose using the shadow positions of the target and the instrument tip to estimate their relative depth and accordingly optimize the instrument tip's insertion trajectory until the tip approaches targets within the iOCT's scanning area. Our method successfully approaches targets on a retina model, and achieves an average depth error of 0.0127 mm for floating targets and 0.3473 mm for retinal targets in the surgical simulator without damaging the retina.
- Europe > Germany > Bavaria > Upper Bavaria > Munich (0.04)
- Asia > China > Guangdong Province > Guangzhou (0.04)
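The shadow cue can be illustrated in its simplest form: the image-plane separation between the instrument tip and its shadow shrinks as the tip approaches the surface, so a calibrated separation yields a relative depth estimate. A sketch, where `scale` is a hypothetical calibration constant standing in for the paper's projection-geometry derivation:

```python
def relative_depth_from_shadow(tip_px, shadow_px, scale):
    """Relative tip-to-surface depth cue from the image-plane distance
    between the instrument tip and its shadow (both in pixels).
    `scale` (mm per pixel of separation) is a hypothetical calibration
    constant; the actual relationship depends on the light-source and
    camera geometry."""
    dx = tip_px[0] - shadow_px[0]
    dy = tip_px[1] - shadow_px[1]
    return scale * (dx * dx + dy * dy) ** 0.5

# Separation of 50 px maps to 0.5 mm under the assumed scale.
d = relative_depth_from_shadow((120, 80), (150, 120), scale=0.01)
```

At contact the tip and its shadow coincide, so the estimate goes to zero, which is what makes the cue usable for guiding the approach before the tip enters the iOCT scanning area.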
Fast calibration for ultrasound imaging guidance based on depth camera
Zhao, Fuqiang, Li, Mingchang, Li, Mengde, Fu, Zhongtao, Li, Miao
During robot-assisted ultrasound (US) puncture, it is important to estimate the location of the puncture from 2D US images. To this end, calibration of the US image becomes an important issue. In this paper, we propose a depth camera-based US calibration method, with an easy-to-deploy device designed for the calibration. With this device, the coordinates of the puncture needle tip are collected in the US image and in the depth camera frame, from which a correspondence matrix is built for calibration. Finally, a number of experiments are conducted to validate the effectiveness of our calibration method.
- North America > United States (0.38)
- Asia > China > Hubei Province > Wuhan (0.05)
- Europe > Germany > Bavaria > Upper Bavaria > Munich (0.04)
- Health & Medicine > Therapeutic Area (0.48)
- Health & Medicine > Diagnostic Medicine > Imaging (0.47)
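One standard way to realize this kind of correspondence-based calibration is a least-squares rigid transform (Kabsch/Umeyama without scale) between matched needle-tip points in the two frames; the paper's exact formulation may differ. A numpy sketch with synthetic correspondences:

```python
import numpy as np

def fit_rigid_transform(P, Q):
    """Least-squares rigid transform: find R, t with Q ≈ R @ P + t
    from matched 3D point sets P, Q of shape (3, N)."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: needle-tip points in the US frame (P) and the same
# points expressed in the depth-camera frame (Q) under a known pose.
rng = np.random.default_rng(1)
P = rng.normal(size=(3, 8))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([[1.0], [2.0], [0.5]])
Q = R_true @ P + t_true
R, t = fit_rigid_transform(P, Q)
```

With noiseless correspondences the known pose is recovered exactly; in practice the US-image tip coordinates are noisy, so more point pairs are collected than the minimum.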
Collaborative Robotic Biopsy with Trajectory Guidance and Needle Tip Force Feedback
Mieling, Robin, Neidhardt, Maximilian, Latus, Sarah, Stapper, Carolin, Gerlach, Stefan, Kniep, Inga, Heinemann, Axel, Ondruschka, Benjamin, Schlaefer, Alexander
The diagnostic value of biopsies is highly dependent on the placement of needles. Robotic trajectory guidance has been shown to improve needle positioning, but feedback for real-time navigation is limited. Haptic display of needle tip forces can provide rich feedback for needle navigation by enabling localization of tissue structures along the insertion path. We present a collaborative robotic biopsy system that combines trajectory guidance with kinesthetic feedback to assist the physician in needle placement. The robot aligns the needle while the insertion is performed in collaboration with a medical expert who controls the needle position on site. We present a needle design that senses forces at the needle tip based on optical coherence tomography and machine learning for real-time data processing. Our robotic setup allows operators to sense deep tissue interfaces independent of frictional forces to improve needle placement relative to a desired target structure. We first evaluate needle tip force sensing in ex-vivo tissue in a phantom study. We characterize the tip forces during insertions with constant velocity and demonstrate the ability to detect tissue interfaces in a collaborative user study. Participants are able to detect 91% of ex-vivo tissue interfaces based on needle tip force feedback alone. Finally, we demonstrate that even smaller, deep target structures can be accurately sampled by performing post-mortem in situ biopsies of the pancreas.
- North America > United States > New York (0.04)
- Europe > Switzerland > Basel-City > Basel (0.04)
- Europe > Germany > Hamburg (0.04)
- (2 more...)
- Research Report > Experimental Study (1.00)
- Research Report > New Finding (0.93)
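The tissue interfaces participants sensed through tip-force feedback can be illustrated with a simple proxy: flag samples where the force rate exceeds a threshold, since crossing an interface produces a sharp rise against the slow frictional build-up. A sketch with synthetic data; the threshold, units, and force profile are illustrative, not the paper's learned OCT-based force estimation:

```python
def detect_interfaces(forces, dt, threshold):
    """Return indices where the needle-tip force rate (N/s) exceeds
    `threshold`, a simple proxy for tissue-interface events."""
    events = []
    for k in range(1, len(forces)):
        rate = (forces[k] - forces[k - 1]) / dt
        if rate > threshold:
            events.append(k)
    return events

# Synthetic insertion: slow friction build-up, then a sharp rise as the
# tip reaches a stiffer tissue interface.
forces = [0.01 * k for k in range(50)] + [0.5 + 0.3 * k for k in range(5)]
events = detect_interfaces(forces, dt=0.01, threshold=5.0)
```

Because the tip sensor measures forces distal to the shaft, the slowly accumulating friction term drops out of the signal of interest, which is why interface detection from tip forces works independently of insertion depth.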